chore: Add project-level README #103
Conversation
Code Coverage Summary
Diff against main
Results for commit: fd0b936

♻️ This comment has been updated with latest results.
CONTRIBUTING.md
Outdated
I'd follow the format of a contributing guide from libraries such as datasets. I'd change the title to "How to contribute to Ragbits?", add a short description of how we value community input (this can be copied from hf datasets), and then add the two most important sections, "How to work on an open Issue?" and "How to create a Pull Request?", with content similar to datasets, adjusted to our workflow.
This is just content saved from the old README, with things specific to this project. Frankly, I don't see the point of having a git tutorial there; maybe if we expected some not-very-technical people to contribute (but we don't). I think in our case it would mostly be a distraction from the important project-specific parts.
Having a "How to work on an open Issue?" section seems okay, but it would require us to have such a flow discussed and ready, including using the tags described there. I don't think we are at that stage yet.
Fair, we can iterate on it later
```
pip install ragbits
```

Alternatively, you can use individual components of the stack by installing their respective packages: `ragbits-core`, `ragbits-document-search`, `ragbits-cli`.
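For context, the per-package alternative discussed below might look like the following sketch. The package names come from the sentence above; the exact set of available packages may differ, so treat this as illustrative rather than an authoritative install list.

```
# Install only the components you need (adjust to your needs)
pip install ragbits-core
pip install ragbits-document-search
pip install ragbits-cli
```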
Let's maybe add one more command snippet with `pip install ragbits-...` for each package, with a comment like "Adjust to your needs" or something like that.
I don't get it. You want a separate snippet for each package? We will have a lot of packages soon, so I don't think that would be a good idea.
I was thinking more of something like this:
```
pip install ragbits-cli
pip install ragbits-core
pip install ragbits-document-search
# other packages as needed...
```
We don't really need to keep this list updated; it just shows the user an alternative way of installation. But I don't have a strong preference here, we can skip it.
I assume that each package will have its own package-specific README (linked from the list of packages above), and the package-specific installation instructions will be there. Does this sound ok to you?
Sounds great
README.md
Outdated
```diff
@@ -1,32 +1,86 @@
 # Ragbits

-Repository for internal experiment with our upcoming LLM framework.
+*A stack of ready-to-use libraries for building chatbots and other LLM-powered applications*
```
Let's stick to the description from the repo - "Building blocks for rapid development of GenAI applications"
Ok, I will, although personally I'm not a fan of marketing slogans that have a lot of buzzwords but fail to explain what the thing actually is. Especially coupled with the fact that we don't have any longer explanation in this README; as a user, I would just click away.
Trivy scanning results.

.venv/lib/python3.10/site-packages/PyJWT-2.9.0.dist-info/METADATA (secrets)
Total: 1 (MEDIUM: 1, HIGH: 0, CRITICAL: 0)
MEDIUM: JWT (jwt-token)

.venv/lib/python3.10/site-packages/litellm/llms/huggingface_llms_metadata/hf_text_generation_models.txt (secrets)
Total: 1 (MEDIUM: 0, HIGH: 0, CRITICAL: 1)
CRITICAL: HuggingFace (hugging-face-access-token)

.venv/lib/python3.10/site-packages/litellm/proxy/_types.py (secrets)
Total: 1 (MEDIUM: 1, HIGH: 0, CRITICAL: 0)
MEDIUM: Slack (slack-web-hook)
No description provided.